Using Expertise for Crowd-Sourcing

Authors

  • David Merritt
  • Mark S. Ackerman
  • Mark W. Newman
  • Pei-Yao Hung
  • Jacob Mandel
  • Erica Ackerman
Abstract

In this paper, we examine whether the use of expertise ratings can help crowd-sourcing systems. We show, using simulations, that a crowd-sourcing system based on social navigation works better when users’ expertise levels are taken into account.
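The abstract's core idea can be illustrated with a minimal sketch. This is not the authors' simulation; the function names, the rating scale, and the expertise weights below are all hypothetical, chosen only to show how an expertise-weighted signal can differ from a plain crowd average.

```python
# Illustrative sketch only (hypothetical, not the paper's method):
# compare a plain average of user ratings against an average
# weighted by each rater's assumed expertise level.

def unweighted_score(ratings):
    """Plain social-navigation signal: mean of all ratings."""
    return sum(ratings) / len(ratings)

def expertise_weighted_score(ratings, expertise):
    """Weight each rating by the rater's (hypothetical) expertise in [0, 1]."""
    weighted_total = sum(w * r for r, w in zip(ratings, expertise))
    return weighted_total / sum(expertise)

# One expert rates an item highly; two novices rate it low.
ratings = [5, 1, 1]
expertise = [0.9, 0.2, 0.2]  # hypothetical expertise levels

print(unweighted_score(ratings))                       # novices dominate
print(expertise_weighted_score(ratings, expertise))    # expert dominates
```

Under these assumed weights the unweighted mean is about 2.33 while the expertise-weighted mean rises to about 3.77, which is the kind of shift the paper's simulations examine.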


Related articles

Design Uncertainty in Crowd-Sourcing Systems

Crowd-sourcing social computing systems represent a new material for HCI designers. Crowd-sourcing has been successfully applied to many areas, but these systems are difficult to reliably design and prototype. One source of uncertainty during design is the behavior of the crowd at different levels of scale. A second source of uncertainty during design is the crowd’s response to differing incent...


For a fistful of dollars: using crowd-sourcing to evaluate a spoken language CALL application

We present an evaluation of a Web-deployed spoken language CALL system, carried out using crowd-sourcing methods. The system, “Survival Japanese”, is a crash course in tourist Japanese implemented within the platform CALL-SLT. The evaluation was carried out over one week using Amazon Mechanical Turk. Although we found a high proportion of attempted scammers, there was a core of 23 subjects ...


Active Learning and Crowd-Sourcing for Machine Translation

In recent years, corpus based approaches to machine translation have become predominant, with Statistical Machine Translation (SMT) being the most actively progressing area. Success of these approaches depends on the availability of parallel corpora. In this paper we propose Active Crowd Translation (ACT), a new paradigm where active learning and crowd-sourcing come together to enable automatic...


Focus Annotation of Task-based Data: Establishing the Quality of Crowd Annotation

We explore the annotation of information structure in German and compare the quality of expert annotation with crowdsourced annotation taking into account the cost of reaching crowd consensus. Concretely, we discuss a crowd-sourcing effort annotating focus in a task-based corpus of German containing reading comprehension questions and answers. Against the backdrop of a gold standard reference r...


The Wisdom of Crowds in Government 2.0: Information Paradigm Evolution toward Wiki-Government

This essay, exploring the peer-to-peer collaborative atmosphere penetrating Wikivism, crowd-sourcing and open-source movement, identifies a new paradigm of public information as evolution toward Wiki-government. Citizen participants can collectively create public information via various platforms enabled by Web 2.0 technologies. Under the new participatory paradigm that a large number of indivi...



Publication date: 2015